Entropy in thermodynamics and information theory
There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by ''S'', of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy, usually expressed as ''H'', developed by Claude Shannon and Ralph Hartley in the 1940s. Shannon, although not initially aware of this similarity, commented on it upon publishing information theory in ''A Mathematical Theory of Communication''.
This article explores the links between the two concepts and the extent to which they can be regarded as connected.
==Equivalence of form of the defining expressions==

The defining expression for entropy in the theory of statistical mechanics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s is of the form:
: S = - k_B \sum_i p_i \ln p_i,\,
where p_i is the probability of the microstate ''i'' taken from an equilibrium ensemble.
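As a concrete illustration, here is a minimal sketch in Python (the two-level system and its occupation probabilities 0.8 and 0.2 are assumed values for illustration, not taken from the article):

```python
# Minimal sketch: Gibbs entropy S = -k_B * sum_i p_i ln p_i for an
# assumed two-level system with occupation probabilities 0.8 and 0.2.
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

probs = [0.8, 0.2]  # hypothetical equilibrium microstate probabilities
S = -k_B * sum(p * math.log(p) for p in probs)
print(S)  # ~6.91e-24 J/K
```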
The defining expression for entropy in the theory of information established by Claude E. Shannon in 1948 is of the form:
: H = - \sum_i p_i \log_b p_i,\,
where p_i is the probability of the message m_i taken from the message space ''M'' and ''b'' is the base of the logarithm used. Common values of ''b'' are 2, ''e'', and 10, and the unit of entropy is the bit for ''b'' = 2, the nat for ''b'' = ''e'', and the dit (or digit) for ''b'' = 10.〔Schneider, T.D., "Information theory primer with an appendix on logarithms", National Cancer Institute, 14 April 2007.〕
Mathematically, ''H'' may also be seen as the average information per message, taken over the message space: when a message occurs with probability p_i, it conveys an amount of information −log_b(p_i).
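The same average can be computed directly; a hedged sketch (the function name and example distributions are illustrative, not from the article):

```python
# Sketch: Shannon entropy H = -sum_i p_i log_b p_i; the base b fixes
# the unit (2 -> bits, e -> nats, 10 -> dits). Zero-probability
# messages contribute nothing, since p log p -> 0 as p -> 0.
import math

def shannon_entropy(probs, base=2):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))           # 1.0 bit: a fair coin
print(shannon_entropy([0.9, 0.1]))           # ~0.469 bits: a biased coin
print(shannon_entropy([0.5, 0.5], math.e))   # ~0.693 nats = ln 2
```

A fair coin attains the maximum entropy for two outcomes (one full bit per toss); any bias lowers the average information.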
If all the microstates are equiprobable (a microcanonical ensemble), the statistical thermodynamic entropy reduces to the form given by Boltzmann,
: S = k_B \ln W \,
where ''W'' is the number of microstates.
If all the messages are equiprobable, the information entropy reduces to the Hartley entropy
: H = \log_b |M|\,
where |M| is the cardinality of the message space ''M''.
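Both reductions can be checked numerically; a quick sketch (the values of ''W'' and |M| below are arbitrary choices for illustration):

```python
# Sketch: with W equiprobable microstates (p_i = 1/W) the Gibbs sum
# collapses to Boltzmann's ln W (here in units of k_B); with |M|
# equiprobable messages the Shannon sum gives Hartley's log_b |M|.
import math

W = 1000
S_over_kB = -sum((1.0 / W) * math.log(1.0 / W) for _ in range(W))
print(math.isclose(S_over_kB, math.log(W)))  # True

M_size = 16
print(math.log(M_size, 2))  # 4.0 bits: Hartley entropy of 16 messages
```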
The logarithm in the thermodynamic definition is the natural logarithm. It can be shown that the Gibbs entropy formula, with the natural logarithm, reproduces all of the properties of the macroscopic classical thermodynamics of Clausius. (See article: Entropy (statistical views)).
The logarithm can also be taken to the natural base in the case of information entropy. This is equivalent to choosing to measure information in nats instead of the usual bits. In practice, information entropy is almost always calculated using base-2 logarithms, but the distinction amounts to nothing other than a change in units. One nat is 1/ln 2 ≈ 1.44 bits.
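The unit conversion itself is a single division; a small sketch:

```python
# Sketch: converting entropy between units divides by the natural log
# of the target base: H_bits = H_nats / ln 2.
import math

print(1 / math.log(2))  # ~1.4427 bits per nat
print(math.log(2))      # ~0.6931 nats per bit
```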
The presence of Boltzmann's constant ''k''_B in the thermodynamic definitions is a historical accident, reflecting the conventional units of temperature. It is there to make sure that the statistical definition of thermodynamic entropy matches the classical entropy of Clausius, thermodynamically conjugate to temperature. For a simple compressible system that can only perform volume work, the first law of thermodynamics becomes
: dE = -p dV + T dS \,
But one can equally well write this equation in terms of what physicists and chemists sometimes call the 'reduced' or dimensionless entropy, σ = ''S''/''k''_B, so that
: dE = -p dV + k_B T d\sigma \,
Just as ''S'' is conjugate to ''T'', so σ is conjugate to ''k''_B''T'' (the energy that is characteristic of ''T'' on a molecular scale).
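To make the change of variables concrete, a sketch with assumed values (the entropy figure reuses the hypothetical two-level example above):

```python
# Sketch: the dimensionless 'reduced' entropy sigma = S / k_B and the
# conjugate energy scale k_B * T. All numbers below are assumptions.
k_B = 1.380649e-23   # J/K

S = 6.91e-24         # J/K, the assumed two-level example above
T = 300.0            # K, room temperature

print(S / k_B)       # ~0.50: reduced entropy sigma (dimensionless)
print(k_B * T)       # ~4.14e-21 J: the molecular energy scale of T
```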
